Projected gradient descent algorithms for quantum state tomography


Similar articles

Spectral Compressed Sensing via Projected Gradient Descent

Let x ∈ C be a spectrally sparse signal consisting of r complex sinusoids, with or without damping. We consider the spectral compressed sensing problem, which is about reconstructing x from its partially revealed entries. By utilizing the low-rank structure of the Hankel matrix corresponding to x, we develop a computationally efficient algorithm for this problem. The algorithm starts from an initi...
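The approach the abstract describes can be sketched in a few lines: a gradient step enforcing consistency with the observed entries, followed by projection of the signal's Hankel matrix back to rank r via a truncated SVD. A minimal NumPy sketch, assuming noise-free samples and a simple zero-filled initialization rather than the paper's initialization scheme (all function names are illustrative):

```python
import numpy as np

def hankel(x, p):
    """Arrange the length-n signal x into a p x (n - p + 1) Hankel matrix."""
    n = len(x)
    return np.array([x[i:i + n - p + 1] for i in range(p)])

def dehankel(H):
    """Map a Hankel-like matrix back to a signal by averaging anti-diagonals."""
    p, q = H.shape
    n = p + q - 1
    x = np.zeros(n, dtype=H.dtype)
    counts = np.zeros(n)
    for i in range(p):
        for j in range(q):
            x[i + j] += H[i, j]
            counts[i + j] += 1
    return x / counts

def pgd_spectral(y, mask, r, n_iter=300, step=1.0):
    """PGD sketch: gradient step on observed entries, then low-rank Hankel projection."""
    n = len(y)
    p = n // 2 + 1
    x = y.copy()                                  # zero-filled initialization
    for _ in range(n_iter):
        x = x - step * mask * (x - y)             # gradient step on data fidelity
        H = hankel(x, p)
        U, s, Vh = np.linalg.svd(H, full_matrices=False)
        x = dehankel((U[:, :r] * s[:r]) @ Vh[:r]) # truncate to rank r, map back
    return x
```

With step = 1 this reduces to an alternating-projection (Cadzow-style) iteration; the paper's analysis covers the general projected-gradient form.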


Material for “Spectral Compressed Sensing via Projected Gradient Descent”

We extend PGD and its recovery guarantee [1] from one-dimensional spectrally sparse signal recovery to the multi-dimensional case. Assume the underlying multi-dimensional spectrally sparse signal is of model order r and total dimension N. We show that O(r log(N)) measurements are sufficient for PGD to achieve successful recovery with high probability provided the underlying signal satisfies so...


CNN-Based Projected Gradient Descent for Consistent Image Reconstruction

We present a new method for image reconstruction which replaces the projector in a projected gradient descent (PGD) with a convolutional neural network (CNN). CNNs trained as high-dimensional (image-to-image) regressors have recently been used to efficiently solve inverse problems in imaging. However, these approaches lack a feedback mechanism to enforce that the reconstructed image is consiste...
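The loop structure the abstract describes can be illustrated with a simple stand-in for the trained CNN: a gradient step enforcing data consistency, followed by a plug-in projector. Here the projector is hard thresholding onto k-sparse vectors (which makes the loop iterative hard thresholding, not the paper's method); the paper's contribution is the learned CNN projector, and everything below is an illustrative sketch:

```python
import numpy as np

def hard_threshold(x, k):
    """Projection onto k-sparse vectors: keep the k largest-magnitude entries."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def pgd_reconstruct(A, y, k, n_iter=500):
    """x <- P(x + mu * A^T (y - A x)): gradient step on data fidelity, then project."""
    mu = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the spectral norm of A
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + mu * A.T @ (y - A @ x)     # enforce consistency with measurements y
        x = hard_threshold(x, k)           # projector (a trained CNN in the paper)
    return x
```

The feedback mechanism the abstract mentions is exactly the gradient step: whatever the projector outputs is pulled back toward agreement with the measured data on every iteration.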


Online Gradient Descent Learning Algorithms

This paper considers the least-square online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without explicit regularization. We present a novel capacity independent approach to derive error bounds and convergence results for this algorithm. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately c...
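A minimal sketch of the algorithm the abstract studies: least-squares online gradient descent in an RKHS with a decaying step size and no explicit regularization term. Each update adds one kernel section to the expansion; the Gaussian kernel and the O(1/sqrt(t)) step-size schedule are illustrative choices, not the paper's specific ones:

```python
import numpy as np

def gauss_kernel(x, z, sigma):
    """Gaussian RKHS kernel on the real line."""
    return np.exp(-(x - z) ** 2 / (2 * sigma ** 2))

class OnlineKernelGD:
    """f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .), stored as coefficients."""

    def __init__(self, eta0=0.5, sigma=0.2):
        self.xs, self.cs = [], []
        self.eta0, self.sigma, self.t = eta0, sigma, 0

    def predict(self, x):
        return sum(c * gauss_kernel(xi, x, self.sigma)
                   for xi, c in zip(self.xs, self.cs))

    def update(self, x, y):
        self.t += 1
        eta = self.eta0 / np.sqrt(self.t)   # decaying step size, no regularizer
        err = self.predict(x) - y           # least-squares pointwise residual
        self.xs.append(x)
        self.cs.append(-eta * err)
```

As the abstract notes, the decaying step sizes play the role that an explicit RKHS regularization term would otherwise play in controlling the capacity of the iterates.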


Boosting Algorithms as Gradient Descent

Much recent attention, both experimental and theoretical, has been focussed on classification algorithms which produce voted combinations of classifiers. Recent theoretical work has shown that the impressive generalization performance of algorithms like AdaBoost can be attributed to the classifier having large margins on the training data. We present an abstract algorithm for finding linear combinati...
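The gradient-descent view of boosting can be made concrete: each round adds the base classifier most correlated with the negative functional gradient of the exponential loss, and the AdaBoost weight is an exact line search for that loss. A toy NumPy sketch with decision stumps on one-dimensional data (all names and parameters are illustrative):

```python
import numpy as np

def stump(x, thresh, sign):
    """A decision stump: sign * (+1 where x > thresh, else -1)."""
    return sign * np.where(x > thresh, 1.0, -1.0)

def boost(x, y, n_rounds=25):
    """AdaBoost as functional gradient descent on the exponential loss."""
    F = np.zeros_like(y, dtype=float)
    ensemble = []
    for _ in range(n_rounds):
        w = np.exp(-y * F)                       # -gradient of exp loss points along w*y
        # pick the stump most correlated with the negative gradient direction
        _, thresh, sign = max((np.sum(w * y * stump(x, t, s)), t, s)
                              for t in x for s in (1.0, -1.0))
        h = stump(x, thresh, sign)
        err = np.clip(np.sum(w[h != y]) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # exact line search for exp loss
        F += alpha * h                           # take the step in function space
        ensemble.append((alpha, thresh, sign))
    return ensemble, F
```

The voted combination the abstract refers to is sign(F): a weighted majority vote over the stumps, with weights produced by the line search.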



Journal

Journal title: npj Quantum Information

Year: 2017

ISSN: 2056-6387

DOI: 10.1038/s41534-017-0043-1